
    An exploratory study of imagining sounds and “hearing” music in autism

    Individuals with autism spectrum disorder (ASD) reportedly possess preserved or superior music-processing skills compared to their typically developing counterparts. We examined auditory imagery and earworms (tunes that get “stuck” in the head) in adults with ASD and in controls. Both groups completed a short earworm questionnaire together with the Bucknell Auditory Imagery Scale. The ASD group reported poorer auditory imagery across all imagery types, yet did not report fewer earworms than matched controls. These data suggest that poor auditory imagery may underlie poor prosody in ASD, but they also highlight a separability between auditory imagery and the control of musical memories, a separability present in the ASD group but not in typically developing individuals.

    The Effect of Liquidity on Governance

    This paper studies the effect of stock liquidity on blockholders’ choice of governance mechanisms. We focus on hedge funds as they are unconstrained by legal restrictions and business ties, and thus have all governance channels at their disposal. Since the threat of governance, not just actual governance, can discipline managers, we use Section 13 filings to measure governance intent rather than only studying instances of actual governance. We find that liquidity increases the likelihood that a hedge fund acquires a block in a firm. Conditional upon acquiring a stake, liquidity reduces the likelihood that a blockholder governs through voice (intervention) – as evidenced by the greater propensity to file Schedule 13Gs (passive investment) rather than 13Ds (active investment). Liquidity is more likely to lead to a 13G filing if the manager’s wealth is sensitive to the stock price, consistent with governance through exit (trading). A 13G filing leads to positive announcement returns, especially in liquid firms. These two results suggest that liquidity does not dissuade blockholders from governing altogether, but instead encourages them to govern through exit rather than voice. We use decimalization as an exogenous shock to liquidity to identify causal effects.

    Enhanced Genre Classification through Linguistically Fine-Grained POS Tags


    Automated Alignment in Multilingual Corpora


    Adjective Density as a Text Formality Characteristic for Automatic Text Classification: A Study Based on the British National Corpus

    PACLIC 23 / City University of Hong Kong / 3-5 December 2009

    DigGAN: Discriminator gradIent Gap Regularization for GAN Training with Limited Data

    Generative adversarial nets (GANs) have been remarkably successful at learning to sample from distributions specified by a given dataset, particularly if the dataset is reasonably large relative to its dimensionality. Given limited data, however, classical GANs struggle, and strategies such as output regularization, data augmentation, use of pre-trained models, and pruning have been shown to lead to improvements. Notably, these strategies 1) are often constrained to particular settings, e.g., requiring the availability of a pre-trained GAN; or 2) increase training time, e.g., when pruning is used. In contrast, we propose a Discriminator gradIent Gap regularized GAN (DigGAN) formulation which can be added to any existing GAN. DigGAN augments existing GANs by encouraging a narrowing of the gap between the norm of the gradient of the discriminator's prediction w.r.t. real images and w.r.t. the generated samples. We observe that this formulation avoids bad attractors within the GAN loss landscape, and we find DigGAN to significantly improve the results of GAN training when limited data is available. Code is available at https://github.com/AilsaF/DigGAN. Accepted to NeurIPS 2022.
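    The gradient-gap penalty described in the abstract can be sketched in a few lines: it is the squared difference between the average norm of the discriminator's input gradient over a real batch and over a generated batch. The sketch below is illustrative only, not the authors' implementation; the single-tanh-unit "discriminator" (whose input gradient has a closed form) and the batch averaging are assumptions made so the example stays self-contained.

```python
import math

# Toy "discriminator": D(x) = tanh(w . x + b). For this hypothetical
# one-unit model the gradient w.r.t. the input x is (1 - tanh(z)^2) * w,
# so we can compute its norm analytically instead of via autodiff.
def disc_input_grad_norm(x, w, b):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    scale = 1.0 - math.tanh(z) ** 2
    return math.sqrt(sum((scale * wi) ** 2 for wi in w))

def diggan_penalty(real_batch, fake_batch, w, b):
    """Squared gap between mean input-gradient norms on real vs. fake data.

    Added to the usual discriminator loss, this term pushes the
    discriminator toward similar gradient magnitudes on both batches.
    """
    g_real = sum(disc_input_grad_norm(x, w, b) for x in real_batch) / len(real_batch)
    g_fake = sum(disc_input_grad_norm(x, w, b) for x in fake_batch) / len(fake_batch)
    return (g_real - g_fake) ** 2
```

    By construction the penalty is zero when real and generated batches induce the same mean gradient norm, and grows as the gap widens; in a real GAN the gradient norms would come from automatic differentiation rather than a closed form.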

    Application Of Statistics In Engineering Technology Programs

    Statistics is a critical tool for robustness analysis, measurement-system error analysis, test-data analysis, probabilistic risk assessment, and many other fields in the engineering world. Traditionally, however, statistics is not used extensively in undergraduate engineering technology (ET) programs, resulting in a major disconnect from industry expectations. The research question of how to effectively integrate statistics into the curricula of ET programs forms the foundation of this paper. Based on best practices identified in the literature, a unique “learning-by-using” approach was deployed in the Electronics Engineering Technology Program at Texas A&M University. Simple statistical concepts such as the standard deviation of measurements, signal-to-noise ratio, and Six Sigma were introduced to students in different courses. Design of experiments (DOE), regression, and the Monte Carlo method were illustrated with practical examples before students applied the newly learned tools to specific problems in their engineering projects. Industry-standard software was used to conduct statistical analysis on real results from lab exercises. Results from a pilot project at Texas A&M University indicate a significant increase in students' use of statistical tools in course projects. Data from student surveys in selected classes indicate that students gained confidence in statistics. These preliminary results show that the new approach is very effective in bringing statistics into engineering technology programs.
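    As a flavor of the kind of “learning-by-using” exercise the abstract describes, a Monte Carlo tolerance analysis fits naturally into an electronics lab: simulate component variation and summarize the output with a mean and standard deviation. The voltage-divider circuit and all component values below are hypothetical illustrations, not taken from the course.

```python
import random
import statistics

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical lab circuit: voltage divider Vout = Vin * R2 / (R1 + R2),
# built from 5%-tolerance resistors. Model each resistor as a normal
# variate with sigma = tolerance / 3 (a common rule of thumb).
VIN, R1_NOM, R2_NOM, TOL = 5.0, 1000.0, 2000.0, 0.05

def vout_sample():
    r1 = random.gauss(R1_NOM, R1_NOM * TOL / 3)
    r2 = random.gauss(R2_NOM, R2_NOM * TOL / 3)
    return VIN * r2 / (r1 + r2)

# Draw many virtual "builds" of the circuit and summarize the spread.
samples = [vout_sample() for _ in range(10_000)]
mean = statistics.mean(samples)
sd = statistics.stdev(samples)
print(f"Vout: mean = {mean:.3f} V, std dev = {sd:.3f} V")
```

    The nominal output is about 3.33 V; the simulated standard deviation shows students directly how component tolerances propagate to the circuit output, which is exactly the robustness-analysis use of statistics the abstract motivates.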

    eSpaceML: An Event-Driven Spatial Annotation Framework


    A Corpus-Based Quantitative Study of Nominalizations across Chinese and British Media English
